
    The Microeconomics of Poverty Traps in Mexico

    Get PDF
    Macroeconomists, development scholars, and policy makers have long recognized the importance of poverty traps as a major cause of persistent inequality and a serious limitation to growth. A poverty trap may be defined as a threshold level below which individuals or households will not increase their well-being regardless of the conditions of the economy. While the importance of poverty traps is widely accepted, the microfoundations behind them (their underlying rationality) are not well understood. In the Mexican setting, this paper contributes in two ways. First, we assume that income depends on the capital (both physical and human) that a household possesses. Hence, if a household is poor and unable to accumulate capital, it will remain poor (unless there is a sudden increase in the returns to its existing capital), and a poverty trap is generated. Following Chavas (2004, 2005), we explicitly model the preferences, consumption, and physical and human capital accumulation of Mexican households. We argue that the typical dynamic model with additive utilities and constant discount rates cannot capture poverty traps: because survival motives are involved, endogenous discounting is needed. Second, employing the same model, we test the impact of the Mexican government's most important social policy program (Progresa-Oportunidades) on alleviating poverty traps. For households with children, the program provides cash transfers conditional on school attendance, which effectively pushes participants to increase their human capital. A comparison between participating and non-participating households should shed light on the effectiveness of the program and on the sensitivity of persistent poverty to cash transfers.
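
    The threshold behaviour invoked above can be illustrated with a deliberately simple simulation: a household spends its income on subsistence consumption first and invests only what remains, so below a critical capital stock the stock decays while above it the stock grows. This is only a toy sketch of the threshold idea, not the Chavas-style model with endogenous discounting used in the paper; the production function, subsistence level, depreciation rate, and starting points are invented for the example.

        def capital_path(k0, periods=60, A=1.2, alpha=0.5, delta=0.08, subsistence=1.0):
            """Toy capital dynamics: income A*k**alpha covers subsistence consumption
            first; only the remainder (if any) is invested. All parameters illustrative."""
            k, path = k0, [k0]
            for _ in range(periods):
                income = A * k ** alpha
                investment = max(income - subsistence, 0.0)   # nothing left to invest when poor
                k = (1 - delta) * k + investment
                path.append(k)
            return path

        # Households starting below the threshold decay towards zero capital;
        # those starting above it accumulate towards a high steady state.
        for k0 in (0.3, 0.5, 1.0, 3.0):
            print(f"k0={k0:3.1f} -> capital after 60 periods: {capital_path(k0)[-1]:7.2f}")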

    Exhaustive enumeration unveils clustering and freezing in random 3-SAT

    Full text link
    We study the geometrical properties of the complete set of solutions of the random 3-satisfiability problem. We show that even for moderate system sizes the number of clusters corresponds surprisingly well with the theoretical asymptotic prediction. We locate the freezing transition in the space of solutions, which has been conjectured to be relevant to the onset of computational hardness in random constraint satisfaction problems. Comment: 4 pages, 3 figures
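
    For small instances the analysis described above can be reproduced directly: enumerate every satisfying assignment, group the solutions into clusters by connecting pairs that differ in a single variable, and call a variable frozen within a cluster when it takes the same value in all of that cluster's solutions. The sketch below uses this one-flip connectivity rule and a simple instance generator as illustrative choices; it is not a reimplementation of the authors' code, and it only scales to small n.

        import random
        from itertools import product

        def random_3sat(n, m, rng):
            """Random 3-SAT formula: m clauses over 3 distinct variables with random signs.
            A literal (v, True) is the positive literal x_v, (v, False) its negation."""
            return [[(v, rng.random() < 0.5) for v in rng.sample(range(n), 3)]
                    for _ in range(m)]

        def solutions(n, clauses):
            """Complete set of solutions by exhaustive enumeration (small n only)."""
            return [assign for assign in product([False, True], repeat=n)
                    if all(any(assign[v] == s for v, s in c) for c in clauses)]

        def clusters(sols):
            """Group solutions into one-flip connected components."""
            sol_set = set(sols)
            seen, comps = set(), []
            for s in sols:
                if s in seen:
                    continue
                comp, stack = [], [s]
                seen.add(s)
                while stack:
                    cur = stack.pop()
                    comp.append(cur)
                    for v in range(len(cur)):                 # flip one variable
                        nb = cur[:v] + (not cur[v],) + cur[v + 1:]
                        if nb in sol_set and nb not in seen:
                            seen.add(nb)
                            stack.append(nb)
                comps.append(comp)
            return comps

        def frozen_fraction(cluster):
            """Fraction of variables taking a single value throughout the cluster."""
            n = len(cluster[0])
            return sum(len({s[v] for s in cluster}) == 1 for v in range(n)) / n

        rng = random.Random(0)
        n, alpha = 16, 4.0                    # ratio m/n inside the asymptotically clustered regime
        sols = solutions(n, random_3sat(n, int(alpha * n), rng))
        comps = clusters(sols)
        print(len(sols), "solutions in", len(comps), "clusters; frozen fractions:",
              [round(frozen_fraction(c), 2) for c in comps])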

    Tropical Cyclone Cold Wake Size and Its Applications to Power Dissipation and Ocean Heat Uptake Estimates

    Get PDF
    Mixing of the upper ocean by the wind field associated with tropical cyclones (TCs) creates observable cold wakes in sea surface temperature and may potentially influence ocean heat uptake. The relationship between cold wake size and storm size, however, has yet to be explored. Here we apply two objective methods to observed daily sea surface temperature data to quantify the size of TC-induced cold wakes. The obtained cold wake sizes agree well with the TC sizes estimated from the QuikSCAT-R wind field database, with correlation coefficients of 0.51 and 0.59 for the two methods, respectively. Furthermore, our new estimate of the total cooling, which incorporates the variations in cold wake size, provides improved estimates of TC power dissipation and TC-induced ocean heat uptake. This study thus highlights the importance of cold wake size in evaluating the climatological effects of TCs.
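
    A minimal version of the size calculation might proceed as follows: take gridded daily SST before and after storm passage, form the cooling anomaly, and sum the area of the grid cells whose cooling exceeds a threshold. The 0.5 degC threshold, the synthetic fields, and the grid spacing below are assumptions made for illustration; the two objective methods used in the paper are not reproduced here.

        import numpy as np

        def cold_wake_area(sst_pre, sst_post, lat, lon, cooling_threshold=0.5):
            """Area (km^2) of the cold wake, defined here as the grid cells where the
            post-storm SST is at least cooling_threshold degC colder than pre-storm.
            lat and lon are 1-D coordinate arrays (degrees) of a regular grid."""
            cooling = sst_pre - sst_post                     # positive where the ocean cooled
            mask = cooling >= cooling_threshold
            dlat, dlon = abs(lat[1] - lat[0]), abs(lon[1] - lon[0])
            km_per_deg = 111.2                               # approximate
            # grid-cell area shrinks with the cosine of latitude
            cell_area = (dlat * km_per_deg) * (dlon * km_per_deg) * np.cos(np.deg2rad(lat))
            return float((mask * cell_area[:, None]).sum())

        # Synthetic example: a 2 degC Gaussian cold anomaly along an assumed storm track.
        lat = np.arange(10.0, 30.0, 0.25)
        lon = np.arange(120.0, 150.0, 0.25)
        LON, LAT = np.meshgrid(lon, lat)
        sst_pre = np.full_like(LAT, 29.0)
        sst_post = sst_pre - 2.0 * np.exp(-(((LAT - 20) / 1.5) ** 2 + ((LON - 135) / 4.0) ** 2))
        print(f"cold wake area: {cold_wake_area(sst_pre, sst_post, lat, lon):.0f} km^2")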

    On the role of information in decision making: the case of sorghum yield in Burkina Faso

    Get PDF
    This paper investigates the role of temporal uncertainty and information issues in economic decisions. It shows that the nature of the economic environment (e.g., the production technology) can influence the valuation of information, which in turn affects the choice functions. This is illustrated by an empirical application to sorghum yield response analysis in Burkina Faso. The paper stresses the importance of technology and information valuation in risk behaviour.
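
    The link between information and choice can be made concrete with a standard expected-value-of-information calculation: a grower choosing an input level before knowing the rainfall state compares the best uninformed decision with the average payoff of deciding after observing the state. The payoff table, probabilities, and state names below are invented for illustration and are unrelated to the Burkina Faso data analysed in the paper.

        # States of nature (e.g. rainfall) with prior probabilities, and a payoff table:
        # expected profit of each input level under each state. All numbers invented.
        priors = {"dry": 0.3, "normal": 0.5, "wet": 0.2}
        payoff = {                     # action -> {state: profit}
            "low_input":  {"dry": 50, "normal": 60, "wet": 65},
            "high_input": {"dry": 20, "normal": 80, "wet": 110},
        }

        # Best action without information: maximise expected profit under the prior.
        expected = {a: sum(priors[s] * p[s] for s in priors) for a, p in payoff.items()}
        best_uninformed = max(expected, key=expected.get)

        # With perfect information the grower picks the best action in each state.
        value_informed = sum(priors[s] * max(p[s] for p in payoff.values()) for s in priors)

        evpi = value_informed - expected[best_uninformed]
        print(f"uninformed choice: {best_uninformed}, expected profit {expected[best_uninformed]:.1f}")
        print(f"with perfect information: {value_informed:.1f}; value of information: {evpi:.1f}")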

    Survey-propagation decimation through distributed local computations

    Full text link
    We discuss the implementation of two distributed solvers for the random K-SAT problem, based on developments of the recently introduced survey-propagation (SP) algorithm. The first solver, called the "SP diffusion algorithm", diffuses the maximum bias over the system as dynamical information, so that variable nodes can decide to freeze in a self-organized way, each variable making its decision on the basis of purely local information. The second solver, called the "SP reinforcement algorithm", makes use of time-dependent external forcing messages on each variable, which drive the variables towards complete polarization in the direction of a solution by the end of a single convergence run. Both methods allow us to find solutions of the random 3-SAT problem in a range of parameters comparable with that of the best previously described serial solvers. The simulated time of convergence towards a solution (if these solvers were implemented on a distributed device) grows as log(N). Comment: 18 pages, 10 figures
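
    The forcing idea behind the second solver can be sketched with a much simpler relative: plain belief propagation for SAT combined with a reinforcement field that is gradually pulled towards the current marginals, so that a single run ends with fully polarised variables. This is a hedged illustration of the reinforcement mechanism only, not the survey-propagation-based solvers described in the paper; the update schedule, the reinforcement rate gamma, and the instance parameters are all assumptions.

        import random
        from collections import defaultdict

        def random_3sat(n, m, rng):
            """Random 3-SAT formula: m clauses over 3 distinct variables with random signs.
            A literal (v, True) is the positive literal x_v, (v, False) its negation."""
            return [[(v, rng.random() < 0.5) for v in rng.sample(range(n), 3)]
                    for _ in range(m)]

        def satisfied(clauses, assign):
            return all(any(assign[v] == s for v, s in c) for c in clauses)

        def reinforced_bp(n, clauses, max_iters=500, gamma=0.05, seed=0):
            """Belief propagation for SAT plus a forcing field nudged towards the current
            marginals, so the variables polarise over the run (a simplified analogue of
            the reinforcement idea, not the SP reinforcement algorithm of the paper)."""
            rng = random.Random(seed)
            by_var = defaultdict(list)               # variable -> [(clause index, sign), ...]
            for a, c in enumerate(clauses):
                for v, s in c:
                    by_var[v].append((a, s))
            # clause->variable message: prob. that x_v takes the value satisfying clause a
            nu = {(a, v): 0.5 + 0.01 * (rng.random() - 0.5)
                  for a, c in enumerate(clauses) for v, _ in c}
            force = [0.5] * n                        # forcing field: prob. that x_v is True

            def marginal(v, skip_clause=None):
                """P(x_v = True) from incoming messages and the forcing field,
                optionally leaving out one clause (the cavity marginal)."""
                p1, p0 = force[v], 1.0 - force[v]
                for a, s in by_var[v]:
                    if a == skip_clause:
                        continue
                    good = nu[(a, v)]
                    p1 *= good if s else 1.0 - good
                    p0 *= (1.0 - good) if s else good
                z = p1 + p0
                return 0.5 if z == 0.0 else p1 / z

            for _ in range(max_iters):
                marg = [marginal(v) for v in range(n)]
                assign = [m > 0.5 for m in marg]
                if satisfied(clauses, assign):
                    return assign
                # reinforcement: move the forcing field towards the current marginals
                force = [(1 - gamma) * f + gamma * m for f, m in zip(force, marg)]
                # standard BP clause->variable update for hard SAT constraints
                for a, c in enumerate(clauses):
                    for v, s in c:
                        # prob. that no other variable of clause a satisfies it
                        prod_unsat = 1.0
                        for w, sw in c:
                            if w != v:
                                pw = marginal(w, skip_clause=a)        # P(x_w = True)
                                prod_unsat *= (1.0 - pw) if sw else pw
                        nu[(a, v)] = 1.0 / (2.0 - prod_unsat)
            return None                              # no satisfying assignment found

        rng = random.Random(1)
        n, alpha = 100, 3.5
        clauses = random_3sat(n, int(alpha * n), rng)
        print("solved" if reinforced_bp(n, clauses) else "no solution found within the budget")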

    Phase Transitions and Computational Difficulty in Random Constraint Satisfaction Problems

    Full text link
    We review the current understanding of random constraint satisfaction problems, focusing on the q-coloring of large random graphs, as achieved using the cavity method of statistical physics. We also discuss the properties of the phase diagram in temperature, the connections with the glass transition phenomenology in physics, and the related algorithmic issues. Comment: 10 pages, Proceedings of the International Workshop on Statistical-Mechanical Informatics 2007, Kyoto (Japan), September 16-19, 2007

    On the cavity method for decimated random constraint satisfaction problems and the analysis of belief propagation guided decimation algorithms

    Full text link
    We introduce a version of the cavity method for diluted mean-field spin models that allows the computation of thermodynamic quantities similar to the Franz-Parisi quenched potential in sparse random graph models. The method is developed for the particular case of partially decimated random constraint satisfaction problems. This makes it possible to develop a theoretical understanding of a class of algorithms for solving constraint satisfaction problems, in which elementary degrees of freedom are sequentially assigned according to the results of a message passing procedure (belief propagation). We compare this theoretical analysis with the results of extensive numerical simulations. Comment: 32 pages, 24 figures
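
    The class of algorithms analysed here follows a simple decimation schedule: estimate a marginal for each free variable, fix the most biased variable to its preferred value, and repeat on the reduced problem. In the sketch below the marginals are computed exactly, by enumerating the surviving solutions of a tiny instance, purely to keep the example self-contained; the paper's analysis concerns the version in which belief propagation supplies the marginals. Instance size, ratio, and seed are illustrative.

        import random
        from itertools import product

        def random_3sat(n, m, rng):
            return [[(v, rng.random() < 0.5) for v in rng.sample(range(n), 3)]
                    for _ in range(m)]

        def surviving_solutions(n, clauses, fixed):
            """All satisfying assignments consistent with the partial assignment
            `fixed` (a dict variable -> bool). Exhaustive, tiny n only."""
            free = [v for v in range(n) if v not in fixed]
            sols = []
            for values in product([False, True], repeat=len(free)):
                assign = dict(fixed)
                assign.update(zip(free, values))
                if all(any(assign[v] == s for v, s in c) for c in clauses):
                    sols.append(assign)
            return sols

        def decimate(n, clauses):
            """Marginal-guided decimation: repeatedly fix the most biased free
            variable to its majority value among the surviving solutions."""
            fixed = {}
            while len(fixed) < n:
                sols = surviving_solutions(n, clauses, fixed)
                if not sols:
                    # only possible here if the formula is unsatisfiable; with BP
                    # estimates in place of exact marginals it can also happen on
                    # satisfiable formulas, which is the failure mode of interest
                    return None
                free = [v for v in range(n) if v not in fixed]
                marg = {v: sum(s[v] for s in sols) / len(sols) for v in free}  # P(x_v = True)
                v = max(free, key=lambda u: abs(marg[u] - 0.5))   # most biased free variable
                fixed[v] = marg[v] > 0.5
            return fixed

        rng = random.Random(3)
        n = 14
        result = decimate(n, random_3sat(n, int(3.5 * n), rng))
        print("full assignment found" if result else "decimation stopped early")

    With exact marginals this schedule stops early only on an unsatisfiable formula; the situation studied in the paper is the one where approximate message-passing marginals replace the exact ones, so the sequence of assignments can leave the set of solutions even when the formula is satisfiable.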